Asynchronous Parallel Algorithms for Nonconvex Big-Data Optimization Part II: Complexity and Numerical Results

Authors

  • Loris Cannelli
  • Francisco Facchinei
  • Vyacheslav Kungurtsev
  • Gesualdo Scutari
Abstract

We present complexity and numerical results for a new asynchronous parallel algorithmic method for the minimization of the sum of a smooth nonconvex function and a convex nonsmooth regularizer, subject to both convex and nonconvex constraints. The proposed method hinges on successive convex approximation techniques and a novel probabilistic model that captures key elements of modern computational architectures and asynchronous implementations more faithfully than state-of-the-art models. In the companion paper [3] we provided a detailed description of the probabilistic model and gave convergence results for a diminishing-stepsize version of our method. Here, we provide theoretical complexity results for a fixed-stepsize version of the method and report extensive numerical comparisons on both convex and nonconvex problems, demonstrating the efficiency of our approach.
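
For concreteness, the problem class described above can be written in the following standard form; the notation here ($f$, $G$, $\mathcal{K}$, the surrogate $\tilde{f}_i$, the delay $\tau$, the stepsize $\gamma$) is illustrative shorthand rather than the paper's own:

$$
\min_{x \in \mathcal{K}} \; F(x) \triangleq f(x) + G(x),
$$

where $f$ is smooth and possibly nonconvex, $G$ is convex and nonsmooth, and $\mathcal{K}$ collects the convex and nonconvex constraints. A generic fixed-stepsize successive-convex-approximation step for a block $x_i$, computed by a worker that may read a delayed iterate $x^{k-\tau}$, is

$$
\hat{x}_i \in \operatorname*{argmin}_{x_i} \; \tilde{f}_i\big(x_i;\, x^{k-\tau}\big) + G(x_i),
\qquad
x_i^{k+1} = x_i^{k} + \gamma \big(\hat{x}_i - x_i^{k}\big),
$$

with $\tilde{f}_i(\cdot\,; x)$ a strongly convex surrogate of $f$ in the $i$-th block.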


Related works

Asynchronous Parallel Algorithms for Nonconvex Big-Data Optimization. Part I: Model and Convergence

We propose a novel asynchronous parallel algorithmic framework for the minimization of the sum of a smooth nonconvex function and a convex nonsmooth regularizer, subject to both convex and nonconvex constraints. The proposed framework hinges on successive convex approximation techniques and a novel probabilistic model that captures key elements of modern computational architectures and asynchro...


Hybrid Random/Deterministic Parallel Algorithms for Nonconvex Big Data Optimization

We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a nonsmooth (possibly nonseparable), convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. The main contribution of this work is a novel parallel, hybrid random/deterministic decomposition scheme wherein, at each ...
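
As a rough illustration of the hybrid random/deterministic idea, the sketch below first samples a random subset of coordinates and then greedily keeps the most promising ones before a coordinate-wise proximal update. All names (soft_threshold, sample_frac, greedy_frac) and the progress score are assumptions made for this example, not the paper's actual scheme:

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1, the typical sparsity regularizer.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def hybrid_block_step(x, grad_f, lam, step, rng, sample_frac=0.5, greedy_frac=0.5):
        # Random part: sample a subset S of coordinates.
        n = x.size
        S = rng.choice(n, size=max(1, int(sample_frac * n)), replace=False)
        # Deterministic part: score candidate proximal-gradient moves and
        # greedily keep the coordinates promising the largest progress.
        cand = soft_threshold(x[S] - step * grad_f[S], step * lam)
        progress = np.abs(cand - x[S])
        k = max(1, int(greedy_frac * S.size))
        keep = S[np.argsort(progress)[-k:]]
        # Update only the selected coordinates; these updates are
        # independent and could run on parallel workers.
        x_new = x.copy()
        x_new[keep] = soft_threshold(x[keep] - step * grad_f[keep], step * lam)
        return x_new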


Balancing the Communication Load of Asynchronously Parallelized Machine Learning Algorithms

Stochastic Gradient Descent (SGD) is the standard numerical method used to solve the core optimization problem for the vast majority of machine learning (ML) algorithms. In the context of large-scale learning, as utilized by many Big Data applications, efficient parallelization of SGD is the focus of active research. Recently, we were able to show that the asynchronous communication paradigm...
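
The asynchronous paradigm mentioned here is often illustrated with a Hogwild!-style scheme, in which workers update a shared parameter vector without locks. Below is a minimal sketch for least squares, not the load-balancing method of the cited work; the function and parameter names are made up for this example:

    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def async_sgd(X, y, n_workers=4, epochs=5, lr=0.01, seed=0):
        # Shared parameter vector, read and written in place by all workers
        # without locking; the resulting benign races are the "asynchrony".
        n, d = X.shape
        w = np.zeros(d)

        def worker(wid):
            rng = np.random.default_rng(seed + wid)
            for _ in range(epochs * n // n_workers):
                i = rng.integers(n)
                # Stochastic gradient of 0.5 * (X[i] @ w - y[i])**2, evaluated
                # at a possibly stale snapshot of w.
                g = (X[i] @ w - y[i]) * X[i]
                w -= lr * g  # lock-free in-place write

        with ThreadPoolExecutor(max_workers=n_workers) as pool:
            for wid in range(n_workers):
                pool.submit(worker, wid)
        return w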


Parallel Asynchronous Stochastic Variance Reduction for Nonconvex Optimization

Nowadays, asynchronous parallel algorithms have received much attention in the optimization field due to the demands of modern large-scale optimization problems. However, most asynchronous algorithms focus on convex problems; analysis of nonconvex problems is lacking. For the Asynchronous Stochastic Gradient Descent (ASGD) algorithm, the best result from (Lian et al., 2015) can only achieve an ...
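
The variance-reduction idea that distinguishes these methods from plain ASGD is the SVRG-style gradient estimator, sketched serially below; the cited algorithms run the inner loop asynchronously across workers, and grad_i / full_grad are user-supplied callables assumed for this example:

    import numpy as np

    def svrg(grad_i, full_grad, x0, n, lr=0.05, outer=20, inner=100, seed=0):
        # grad_i(x, i): stochastic gradient of the i-th component at x.
        # full_grad(x): full gradient, recomputed once per outer epoch.
        rng = np.random.default_rng(seed)
        x = x0.copy()
        for _ in range(outer):
            x_snap = x.copy()
            mu = full_grad(x_snap)  # anchor gradient at the snapshot
            for _ in range(inner):
                i = rng.integers(n)
                # Variance-reduced estimator: unbiased, and its variance
                # vanishes as x approaches the snapshot point.
                v = grad_i(x, i) - grad_i(x_snap, i) + mu
                x -= lr * v
        return x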


Evaluating Efficiency of Big-Bang Big-Crunch Algorithm in Benchmark Engineering Optimization Problems

Engineering optimization needs easy-to-use and efficient optimization tools that can be employed for practical purposes. In this context, stochastic search techniques have a good reputation and wide acceptance as powerful tools for solving complex engineering optimization problems. However, the increased complexity of some metaheuristic algorithms sometimes makes it difficult for engineers t...
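
For readers unfamiliar with the method, Big-Bang Big-Crunch alternates a "crunch" (collapsing the population to a fitness-weighted center of mass) with a "bang" (rescattering around that center with a shrinking radius). The sketch below is one common variant for minimization, with all parameter names chosen for illustration:

    import numpy as np

    def big_bang_big_crunch(f, lb, ub, pop=50, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        d = lb.size
        X = rng.uniform(lb, ub, size=(pop, d))  # initial "big bang"
        for k in range(1, iters + 1):
            fit = np.array([f(x) for x in X])
            # "Big crunch": fitness-weighted center of mass (better points
            # receive larger weights; the shift keeps weights positive).
            w = 1.0 / (fit - fit.min() + 1e-12)
            center = (w[:, None] * X).sum(axis=0) / w.sum()
            # "Big bang": rescatter around the center with radius ~ 1/k.
            X = np.clip(center + (ub - lb) * rng.standard_normal((pop, d)) / k, lb, ub)
        fit = np.array([f(x) for x in X])
        return X[fit.argmin()], fit.min()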



Journal:

Volume   Issue

Pages  -

Publication year: 2017